# 6.7 Billion Parameters

## Fairseq Dense 6.7B

This is the Hugging Face transformers adaptation of the original dense 6.7B-parameter model from the paper "Efficient Large Scale Language Modeling with Mixtures of Experts" by Artetxe et al.

Large Language Model · Transformers · English
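Since the card describes a Hugging Face transformers adaptation, a minimal loading sketch may help. The repo id `KoboldAI/fairseq-dense-6.7B` is an assumption inferred from the uploader shown on this page; verify it on the Hub before use.

```python
# Minimal sketch of loading Fairseq Dense 6.7B with Hugging Face transformers.
# Assumption: the Hub repo id is "KoboldAI/fairseq-dense-6.7B" (not confirmed
# by this page); check the model page on huggingface.co before running.

MODEL_ID = "KoboldAI/fairseq-dense-6.7B"

def generate(prompt: str, max_new_tokens: int = 40) -> str:
    # Import lazily so the module can be inspected without the heavy
    # dependencies (transformers, torch) installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # A 6.7B-parameter model needs roughly 13 GB of memory in fp16;
    # device_map="auto" (requires the accelerate package) places layers
    # on GPU when one is available.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Efficient large-scale language modeling"))
```

The generation call is guarded so importing the file does not trigger the multi-gigabyte download.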
KoboldAI
© 2025 AIbase